Computing 2-Step Predictions for Interval-Valued Finite Stationary Markov Chains

Authors

  • Marcilia Andrade Campos
  • Graçaliz Pereira Dimuro
  • Antônio Carlos da Rocha Costa
  • Vladik Kreinovich
  • Felix da Cunha
Abstract

Markov chains are a useful tool for solving practical problems. In many real-life situations, we do not know the exact values of the initial and transition probabilities; instead, we only know intervals of possible values for these probabilities. Such interval-valued Markov chains were considered and analyzed by I. O. Kozine and L. V. Utkin in their Reliable Computing paper, where they propose an efficient algorithm for computing interval-valued probabilities of the future states. For the general case of non-stationary Markov chains, their algorithm leads to the exact intervals for the probabilities of future states. In the important case of stationary Markov chains, their algorithm can still be applied: 1-step predictions remain exact, but 2-step predictions sometimes lead to intervals that are wider than the exact ranges. In this paper, we describe a modification of the Kozine-Utkin algorithm that always produces exact 2-step predictions for stationary Markov chains.

1 Formulation of the Problem

Markov chains: brief reminder. In many real-life systems, ranging from weather to hardware to psychological systems, the transition probabilities do not depend on the history, only on the current state. Such systems are naturally described as finite Markov chains; see, e.g., [2, 3, 8, 14, 15, 16, 17].
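Before turning to the formal setup, a small numerical illustration of the effect mentioned in the abstract may help. The sketch below is hypothetical: the transition bounds, the sampling routine, and all names are illustrative assumptions, and it implements neither the Kozine-Utkin algorithm nor the modification proposed in this paper. It propagates interval bounds entrywise with naive interval arithmetic; for a stationary chain this ignores the fact that the same point matrix acts at both steps, so the 2-step enclosure can be wider than the attainable range.

    # Hypothetical illustration (not the Kozine-Utkin algorithm or the paper's
    # modification): naive interval arithmetic for 1-step and 2-step predictions
    # of a stationary interval-valued Markov chain, compared with an inner
    # estimate of the attainable 2-step range obtained by random sampling.

    import random

    # Interval transition matrix: (P_lo[i][j], P_hi[i][j]) bounds the
    # probability of moving from state i to state j.
    P_lo = [[0.6, 0.2], [0.3, 0.5]]
    P_hi = [[0.8, 0.4], [0.5, 0.7]]
    x0   = [1.0, 0.0]               # exactly known initial distribution

    def interval_step(lo_vec, hi_vec):
        """One naive interval-arithmetic step: bounds on x_{t+1} = x_t * P."""
        n = len(lo_vec)
        new_lo = [sum(lo_vec[i] * P_lo[i][j] for i in range(n)) for j in range(n)]
        new_hi = [sum(hi_vec[i] * P_hi[i][j] for i in range(n)) for j in range(n)]
        return new_lo, new_hi

    # Naive propagation: 1 step, then 2 steps.
    lo1, hi1 = interval_step(x0, x0)
    lo2, hi2 = interval_step(lo1, hi1)
    print("naive 1-step intervals:", list(zip(lo1, hi1)))
    print("naive 2-step intervals:", list(zip(lo2, hi2)))

    def sample_stochastic_matrix():
        """Draw a point matrix inside the interval bounds with rows summing to 1."""
        while True:
            P = [[random.uniform(P_lo[i][j], P_hi[i][j]) for j in range(2)]
                 for i in range(2)]
            ok = True
            for row, lo_row, hi_row in zip(P, P_lo, P_hi):
                # Force each row onto the simplex via its last entry, then
                # check that the adjusted entry still respects its bounds.
                row[1] = 1.0 - row[0]
                if not (lo_row[1] <= row[1] <= hi_row[1]):
                    ok = False
            if ok:
                return P

    # Inner estimate of the exact 2-step range: in a stationary chain the SAME
    # point matrix is used at both steps, which the naive propagation ignores.
    best_lo = [float("inf")] * 2
    best_hi = [float("-inf")] * 2
    for _ in range(20000):
        P = sample_stochastic_matrix()
        x1 = [sum(x0[i] * P[i][j] for i in range(2)) for j in range(2)]
        x2 = [sum(x1[i] * P[i][j] for i in range(2)) for j in range(2)]
        for j in range(2):
            best_lo[j] = min(best_lo[j], x2[j])
            best_hi[j] = max(best_hi[j], x2[j])
    print("sampled 2-step range:  ", list(zip(best_lo, best_hi)))

With these illustrative bounds, the 1-step intervals coincide with the exact ranges, but the naive 2-step enclosure for state 0 is [0.42, 0.84] while the sampled attainable values stay within approximately [0.48, 0.74].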

Similar resources

ON THE STATIONARY PROBABILITY DENSITY FUNCTION OF BILINEAR TIME SERIES MODELS: A NUMERICAL APPROACH

In this paper, we show that the Chapman-Kolmogorov formula could be used as a recursive formula for computing the m-step-ahead conditional density of a Markov bilinear model. The stationary marginal probability density function of the model may be approximated by the m-step-ahead conditional density for sufficiently large m.
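As a reference point, and using generic Markov-chain notation rather than the bilinear-model notation of that paper, the Chapman-Kolmogorov recursion for the m-step-ahead conditional density and the approximation of the stationary density (for an ergodic chain) read:

\[
f_m(y \mid x_0) \;=\; \int f_{m-1}(z \mid x_0)\, f_1(y \mid z)\, dz, \qquad m \ge 2,
\qquad
\pi(y) \;\approx\; f_m(y \mid x_0) \ \text{for sufficiently large } m .
\]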

Existence of Stationary Distributions for Finite-State Markov Chains

4.1 The Concept of Stochastic Equilibrium
4.2 Existence of Stationary Distributions for Finite-State Markov Chains
4.3 Definitions and Simple Consequences
4.4 A Test for Recurrence
4.5 Proving Po...

Model-Checking Markov Chains in the Presence of Uncertainties

We investigate the problem of model checking Interval-valued Discrete-time Markov Chains (IDTMCs). IDTMCs are discrete-time finite Markov chains for which the exact transition probabilities are not known; instead, each transition is associated with an interval in which the actual transition probability must lie. We consider two semantic interpretations for the uncertainty in the transi...

Series Expansions for Finite-state Markov Chains

This paper provides series expansions of the stationary distribution of a finite Markov chain, which lead to an efficient numerical algorithm for computing that distribution. Numerical examples are given to illustrate the performance of the algorithm.
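The series-expansion algorithm itself is not reproduced here. As a point of comparison, the following is a minimal sketch (with an illustrative two-state matrix) of the standard direct way to compute the stationary distribution of a finite chain, by solving pi P = pi together with the normalization constraint:

    # Baseline for comparison (not the series-expansion algorithm of the paper):
    # compute the stationary distribution pi of a finite Markov chain directly,
    # by solving pi (P - I) = 0 together with sum(pi) = 1.

    import numpy as np

    def stationary_distribution(P):
        """Stationary distribution of a row-stochastic matrix P (assumed ergodic)."""
        n = P.shape[0]
        # Balance equations (P.T - I) pi = 0, augmented with the
        # normalization equation sum(pi) = 1.
        A = np.vstack([P.T - np.eye(n), np.ones(n)])
        b = np.zeros(n + 1)
        b[-1] = 1.0
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pi

    P = np.array([[0.7, 0.3],
                  [0.4, 0.6]])
    print(stationary_distribution(P))   # approx. [0.5714, 0.4286]

The direct approach amounts to a dense linear solve, which becomes costly for large state spaces; this is the kind of cost that more efficient schemes, such as series expansions, aim to reduce.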

Measure-Valued Differentiation for Stationary Markov Chains

We study general state-space Markov chains that depend on a parameter, say θ. Sufficient conditions are established for the stationary performance of such a Markov chain to be differentiable with respect to θ. Specifically, we study the case of unbounded performance functions and thereby extend the result on weak differentiability of stationary distributions of Markov chains to unbounded mapping...

Publication date: 2004